# Child Language Acquisition
## BabyBERTa 1
A lightweight RoBERTa variant trained on a 5-million-word corpus of American English child-directed speech, designed for language acquisition research.
Tags: Large Language Model, Transformers, English

Author: phueb
## BabyBERTa 3
License: MIT
BabyBERTa is a lightweight RoBERTa variant, specifically designed for language acquisition research and trained on a 5-million-word corpus of American English child-directed input.
Tags: Large Language Model, Transformers, English

Author: phueb